
    Comparison of Empirical Propagation Path Loss Models for Mobile Communication

    Empirical propagation models have found favor in both research and industrial communities owing to their speed of execution and their limited reliance on detailed knowledge of the terrain. In mobile communication, accurate prediction of path loss is a crucial element of network planning and optimization. However, no single propagation model predicts path loss accurately in every environment; each performs best in the environments for which it was designed. This paper presents several empirical models suitable for path loss prediction in mobile communication. Experimental measurements of received power for the 900 MHz GSM system were made in urban, suburban, and rural areas of Dar es Salaam, Tanzania. Measured data are compared with those obtained by five prediction models: the Stanford University Interim (SUI) models [1], the COST-231 Hata model [2], the ECC-33 model [3], the Ericsson model [4], and the Hata-Okumura model [5]. The results show that, in general, the SUI, COST-231, Ericsson, and Hata-Okumura models under-predict the path loss in all environments, while the ECC-33 model gives the best results, especially in suburban areas, though it over-predicts path loss in urban areas. Keywords: propagation path loss, empirical models, radio coverage, mobile communications
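Of the models compared above, the Hata-Okumura model is the one commonly applied at 900 MHz. A minimal sketch of its urban median path loss formula follows; the parameter values in the usage comment are illustrative, not measurements from the paper.

```python
import math

def hata_okumura_urban(f_mhz, h_b, h_m, d_km):
    """Hata-Okumura median path loss (dB) for an urban macro-cell.

    Valid roughly for f in 150-1500 MHz, base antenna height 30-200 m,
    mobile antenna height 1-10 m, and distance 1-20 km.
    """
    # Mobile antenna height correction for a small/medium-sized city.
    a_hm = (1.1 * math.log10(f_mhz) - 0.7) * h_m - (1.56 * math.log10(f_mhz) - 0.8)
    return (69.55 + 26.16 * math.log10(f_mhz)
            - 13.82 * math.log10(h_b) - a_hm
            + (44.9 - 6.55 * math.log10(h_b)) * math.log10(d_km))

# Example (illustrative parameters): 900 MHz, 30 m base station,
# 1.5 m mobile, 5 km separation -> roughly 151 dB median path loss.
pl = hata_okumura_urban(900, 30, 1.5, 5)
```

The distance term grows with log10(d), so doubling the link distance adds a fixed number of dB determined by the base antenna height.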

    Improved handover decision scheme for 5g mm-wave communication: optimum base station selection using machine learning approach.

    A Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy in Information and Communication Science and Engineering of the Nelson Mandela African Institution of Science and Technology. The rapid growth in mobile and wireless devices has led to an exponential demand for data traffic and exacerbated the burden on conventional wireless networks. Fifth generation (5G) and beyond networks are expected not only to accommodate this growth in data demand but also to provide additional services beyond the capability of existing wireless networks, while maintaining a high quality-of-experience (QoE) for users. The need for several orders of magnitude increase in system capacity has necessitated the use of millimetre wave (mm-wave) frequencies as well as the proliferation of low-power small cells overlaying the existing macro-cell layer. These approaches offer a potential increase in throughput of several gigabits per second and a reduction in transmission latency, but they also present new challenges. For example, mm-wave frequencies have higher propagation losses and a limited coverage area, thereby escalating mobility challenges such as more frequent handovers (HOs). In addition, the advent of low-power small cells with smaller footprints also causes signal fluctuations across the network, resulting in repeated HOs (ping-pong) from one small cell (SC) to another. Therefore, efficient HO management is critical in future cellular networks, since frequent HOs pose multiple threats to the quality-of-service (QoS), such as a reduction in system throughput as well as service interruptions, which result in a poor QoE for the user. However, HO management is a significant challenge in 5G networks due to the use of mm-wave frequencies, which have much smaller footprints.
To address these challenges, this work investigates the HO performance of 5G mm-wave networks and proposes a novel method for achieving seamless user mobility in dense networks. The proposed model is based on a double deep reinforcement learning (DDRL) algorithm. To test the performance of the model, a comparative study was made between the proposed approach and benchmark solutions, including a benchmark developed as part of this thesis. The evaluation metrics considered include system throughput, execution time, ping-pong, and the scalability of the solutions. The results reveal that the developed DDRL-based solution vastly outperforms not only conventional methods but also other machine-learning-based benchmark techniques. The main contribution of this thesis is to provide an intelligent framework for mobility management in the connected state (i.e., HO management) in 5G. Though primarily developed for mm-wave links between UEs and BSs in ultra-dense heterogeneous networks (UDHNs), the proposed framework can also be applied to sub-6 GHz frequencies.
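The "double" idea underlying DDRL can be illustrated with tabular double Q-learning, where one value estimate selects the best next base station and the other evaluates it, reducing the over-estimation bias of plain Q-learning. This is a simplified sketch: states, rewards, and the number of candidate BSs are illustrative, and the thesis's model uses deep networks rather than tables.

```python
import random

ALPHA, GAMMA = 0.1, 0.9  # illustrative learning rate and discount factor
N_STATES, N_BS = 3, 4    # hypothetical state count and candidate BSs (actions)

# Two independent action-value tables, Q_a and Q_b.
Q_a = {(s, a): 0.0 for s in range(N_STATES) for a in range(N_BS)}
Q_b = {(s, a): 0.0 for s in range(N_STATES) for a in range(N_BS)}

def update(state, action, reward, next_state):
    """One double Q-learning update for a BS-selection decision."""
    # Randomly pick which table to update; the other table evaluates the
    # greedy next action, decoupling selection from evaluation.
    if random.random() < 0.5:
        best = max(range(N_BS), key=lambda a: Q_a[(next_state, a)])
        target = reward + GAMMA * Q_b[(next_state, best)]
        Q_a[(state, action)] += ALPHA * (target - Q_a[(state, action)])
    else:
        best = max(range(N_BS), key=lambda a: Q_b[(next_state, a)])
        target = reward + GAMMA * Q_a[(next_state, best)]
        Q_b[(state, action)] += ALPHA * (target - Q_b[(state, action)])
```

In a HO setting, the reward would typically combine achieved throughput with penalties for HO and ping-pong events, so the learned policy trades connection quality against HO frequency.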

    Multi-User Position based on Trajectories-Aware Handover Strategy for Base Station Selection with Multi-Agent Learning

    This paper presents an optimal Base Station (BS) selection method for proactive handover (HO) decisions in millimeter-wave (mm-wave) wireless communication. The mm-wave spectrum suffers significantly from high path loss and blockage caused by either controlled or uncontrolled sources. While the primary purpose of utilizing mm-wave is to achieve a high data rate, the presence of obstacles degrades overall system performance, since the link between the User Equipment (UE) and the serving BS becomes intermittent. The repercussion of this sporadic link is an increased number of HOs. Proactive HO and the minimization of unnecessary HOs are considered as solutions for increasing throughput, and this paper presents one based on a Reinforcement Learning (RL) framework. The framework learns from multiple UE trajectories: agents learn simultaneously using Multi-Agent RL (MARL), mapping each trajectory's features to its respective Q-value through an Artificial Neural Network (ANN). The numerical results show that the learned agent minimizes the number of HOs and also outperforms a heuristic HO strategy in terms of throughput.

    Deep Reinforcement Learning based Handover Management for Millimeter Wave Communication

    This research article was published in the International Journal of Advanced Computer Science and Applications, Vol. 12, No. 2, 2021. The Millimeter Wave (mm-wave) band has a broad spectrum capable of supporting multi-gigabit-per-second data rates. However, the band suffers seriously from obstruction and high path loss, resulting in line-of-sight (LOS) and non-line-of-sight (NLOS) transmissions. All this leads to significant fluctuation in the signal received at the user end. Signal fluctuations present an unprecedented challenge in implementing the fifth generation (5G) use-cases of the mm-wave spectrum. They also increase the user's chances of changing the serving Base Station (BS) in the process, commonly known as Handover (HO). HO events become frequent in an ultra-dense network scenario, and HO management becomes increasingly challenging as the number of BSs increases. HOs reduce network throughput, and hence the significance of mm-wave to the 5G wireless system is diminished without adequate HO control. In this study, we propose a model for HO control based on an offline reinforcement learning (RL) algorithm that autonomously and smartly optimizes HO decisions, taking into account prolonged user connectivity and throughput. We conclude by presenting the proposed model's performance and comparing it with a state-of-the-art rate-based HO scheme. The results reveal that the proposed model decreases excess HOs by 70%, thus achieving a higher throughput relative to the rate-based HO scheme.

    Handover Management in Dense Networks with Coverage Prediction from Sparse Networks

    Millimeter Wave (mm-Wave) provides high bandwidth and is expected to increase network capacity a thousand-fold in future generations of mobile communications. However, since mm-Wave is sensitive to blockage and incurs a high penetration loss, it adds complexity and bottlenecks to the realization of substantial gains. Network densification, a solution to sensitivity and blockage, increases the handover (HO) rate and the number of unnecessary and ping-pong HOs, which in turn reduces network throughput. To minimize the effect of an increased HO rate, Time to Trigger (TTT) and the Hysteresis factor (H) have been used in Long Term Evolution (LTE). In this paper, we consider two networks that differ in Evolved NodeB (eNB) density: sparse and dense. As the names suggest, the eNB density in the dense network is higher than in the sparse network. We propose an optimal eNB selection mechanism for 5G intra-mobility HO based on spatial information from the sparse eNB network. In this approach, User Equipment (UE) in the dense network initially connects only to a few selected eNBs, which are delivered from the sparse network. An HO event occurs only when the serving eNB can no longer satisfy the minimum Signal-to-Noise Ratio (SNR) threshold. The remaining eNBs deployed in the dense network follow the conventional HO procedure. Results reveal that the HO rate decreases significantly with the proposed approach for TTT values between 0 ms and 256 ms, while keeping the radio link failure (RLF) rate at an acceptable level: less than 2% for TTT values between 0 ms and 160 ms. This study paves the way for HO management in future 5G networks.
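The TTT and hysteresis mechanism mentioned above works roughly like the LTE A3 event: a HO is triggered only after a neighbour cell stays better than the serving cell by the hysteresis margin for the whole TTT window. A minimal sketch follows; the measurement traces in the test are illustrative, not data from the paper.

```python
def a3_handover_step(serving_dbm, neighbour_dbm, hysteresis_db, ttt_steps):
    """Return the first measurement index at which HO fires, or None.

    HO fires once the neighbour exceeds the serving cell by the hysteresis
    margin continuously for `ttt_steps` consecutive measurement periods
    (a discretized Time to Trigger).
    """
    run = 0  # length of the current streak satisfying the A3 condition
    for i, (s, n) in enumerate(zip(serving_dbm, neighbour_dbm)):
        run = run + 1 if n > s + hysteresis_db else 0
        if run >= ttt_steps:
            return i
    return None
```

Raising either the hysteresis margin or the TTT suppresses ping-pong HOs caused by brief signal fluctuations, at the cost of reacting more slowly to a genuinely better cell.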

    Effect of tuberculosis infection on mortality of HIV-infected patients in Northern Tanzania.

    BACKGROUND: TB and HIV are public health problems that have a synergistic effect on each other. Despite the decreasing burden of these two diseases, they still make a significant contribution to mortality. Tanzania is among the 30 high TB and HIV burden countries. METHODS: Routine data over 6 years from people living with HIV (PLHIV) attending health facilities in three regions of Northern Tanzania were analyzed, showing mortality trends from 2012 to 2017 for HIV and HIV/TB subpopulations. Poisson regression with a frailty model, adjusting for clustering at the health facility level, was used to determine mortality rate ratios (RR) and 95% confidence intervals (95% CI). RESULTS: Among all PLHIV the overall mortality rate was 28.4 (95% CI 27.6-29.2) deaths per 1000 person-years. For PLHIV with no evidence of TB the mortality rate was 26.2 (95% CI 25.4-27.0) per 1000 person-years, and for those with HIV/TB co-infection 57.8 (95% CI 55.6-62.3) per 1000 person-years. After adjusting for age, sex, residence, WHO stage, and body weight, PLHIV with TB co-infection had 40% higher mortality than those without TB (RR 1.4; 95% CI 1.24-1.67). CONCLUSIONS: Over the 6-year period, mortality rates for HIV/TB patients were consistently higher than for PLHIV without TB. More effort should be directed toward improving nutritional status among HIV patients, as poor nutrition interacts destructively with TB to increase mortality. This will improve patients' body weight and CD4 counts, which are protective against mortality. Among PLHIV, attention should be given to those in WHO HIV stage 3 or 4 with TB co-infection.

    Incidence Rates for Tuberculosis Among HIV Infected Patients in Northern Tanzania.

    Background: HIV and tuberculosis (TB) are leading infectious diseases, with a high risk of co-infection. The risk of TB in people living with HIV (PLHIV) is high soon after sero-conversion and increases as CD4 counts are depleted. Methodology: We used routinely collected data from Care and Treatment Clinics (CTCs) in three regions in northern Tanzania. All PLHIV attending CTCs between January 2012 and December 2017 were included in the analysis. TB incidence was defined as cases started on anti-TB medications divided by the person-years of follow-up. Poisson regression with frailty models was used to determine incidence rate ratios (IRR) and 95% confidence intervals (95% CI) for predictors of TB incidence among HIV-positive patients. Results: Among 78,748 PLHIV, 405 patients developed TB over 195,296 person-years of follow-up, giving an overall TB incidence rate of 2.08 per 1,000 person-years. TB incidence was higher in hospitals, at 3.35 per 1,000 person-years, than in lower-level health facilities. Compared to CD4 counts of <350 cells/μl, a high CD4 count was associated with lower TB incidence: 81% lower for a CD4 count of 350-500 cells/μl (IRR 0.19, 95% CI 0.04-0.08) and 85% lower for those with a CD4 count above 500 cells/μl (IRR 0.15, 95% CI 0.04-0.64). Independently, those taking ART had 66% lower TB incidence (IRR 0.34, 95% CI 0.15-0.79) compared to those not taking ART. Poor nutritional status and CTC enrollment between 2008 and 2012 were associated with higher TB incidence: IRR 9.27 (95% CI 2.15-39.95) and IRR 2.97 (95% CI 1.05-8.43), respectively. Discussion: There has been a decline in TB incidence since 2012, with the exception of 2017, when TB incidence was higher, probably due to better diagnosis of TB following a national initiative.
Among HIV-positive patients attending CTCs, poor nutritional status, low CD4 counts, and not taking ART were associated with higher TB incidence, highlighting the need to get PLHIV on treatment early and the need for close monitoring of CD4 counts. Routinely collected and available health services data can be used to provide evidence of the epidemiological risk of TB.

    Enhancing access to genetic resources for climate change adaptation in Kenya, Uganda and Tanzania: Seed catalogues of best performing varieties of sorghum in Dodoma and Singida Tanzania

    Climate change poses an increasing threat to the food and nutrition security of resource-poor farmers globally. In Tanzania, the homogenization of agriculture to single crops or varieties, coupled with the associated loss of biodiversity, has further decreased the resilience of resource-poor farmers. The loss of genetic diversity in farmers' custody has greatly narrowed the gene pool on which they depend. To help them adapt to climate change, the project "Promoting Open Source Seed Systems for Beans, Millet and Sorghum for Climate Change Adaptation", funded by the Benefit-sharing Fund (BSF) of the International Treaty on Plant Genetic Resources for Food and Agriculture (ITPGRFA), was implemented in Kenya, Uganda and Tanzania. Through this project, farmers in Dodoma and Singida in Tanzania tested and evaluated the performance of 24 sorghum varieties for drought tolerance, yield, early maturity, pest and disease resistance, and taste, and selected the 10 best-performing ones. This catalogue presents these top selected varieties, including their agronomic attributes and nutritional benefits.

    Energy Optimisation Through Path Selection for Underwater Wireless Sensor Networks

    This paper explores energy-efficient ways of retrieving data from underwater sensor fields using autonomous underwater vehicles (AUVs). Since AUVs are battery-powered and therefore energy-constrained, their energy consumption is a critical consideration in designing underwater wireless sensor networks. The energy consumed by an AUV depends on its hydrodynamic design, speed, on-board payload, and trajectory. In this paper, we optimise the trajectory taken by an AUV deployed from a floating ship to collect data from every cluster head in an underwater sensor network and return to the ship to offload the data. The trajectory optimisation algorithm models trajectory selection as a stochastic shortest path problem and uses reinforcement learning to select the minimum-cost path, taking into account that banked turns consume more energy than straight movement. We also investigate the impact of AUV speed on its energy consumption. The results show that our algorithm reduces AUV energy consumption by up to 50% compared with the Nearest Neighbour algorithm for sparse deployments.
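The key cost-model idea in the abstract, that banked turns consume more energy than straight movement, can be sketched as a tour-energy function over waypoints: distance cost plus a penalty proportional to the heading change at each waypoint. The coefficients here are assumptions for illustration, not values from the paper.

```python
import math

DIST_COST = 1.0   # assumed energy units per unit distance travelled
TURN_COST = 0.5   # assumed energy units per radian of heading change

def tour_energy(points):
    """Energy for an AUV visiting `points` in order (no return leg).

    Sums straight-line travel cost plus a turn penalty at each
    intermediate waypoint, so a bent path costs more than a straight
    path of equal length.
    """
    energy = 0.0
    for i in range(1, len(points)):
        (x0, y0), (x1, y1) = points[i - 1], points[i]
        energy += DIST_COST * math.hypot(x1 - x0, y1 - y0)
        if i >= 2:
            # Heading change between the previous and current segment.
            a0 = math.atan2(y0 - points[i - 2][1], x0 - points[i - 2][0])
            a1 = math.atan2(y1 - y0, x1 - x0)
            turn = abs((a1 - a0 + math.pi) % (2 * math.pi) - math.pi)
            energy += TURN_COST * turn
    return energy
```

Under such a cost model, a greedy Nearest Neighbour tour can be beaten by a path that accepts slightly longer segments in exchange for gentler turns, which is the kind of trade-off an RL-based path selector can learn.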

    DEKCS: a dynamic clustering protocol to prolong underwater sensor networks

    Energy consumption is a critical issue in the design of wireless underwater sensor networks (WUSNs). Data transfer in the harsh underwater channel requires higher transmission powers than an equivalent terrestrial network needs to achieve the same range. However, battery-operated underwater sensor nodes are energy-constrained, which requires them to transmit at low power to conserve energy. Clustering is a technique for partitioning wireless networks into groups in which a local base station (the cluster head) is only one hop away. Due to their proximity to the cluster head, sensor nodes can lower their transmit power, thereby improving network energy efficiency. This paper describes the implementation of a new clustering algorithm to prolong the lifetime of WUSNs. We propose a new protocol called the distance- and energy-constrained k-means clustering scheme (DEKCS) for cluster head selection. A potential cluster head is selected based on its position in the cluster and its residual battery level. We dynamically update the residual energy thresholds set for potential cluster heads to ensure that the network fully runs out of energy before it becomes disconnected. We also leverage the elbow method to dynamically select the optimal number of clusters according to the network size, thereby making the network scalable. Our evaluations show that the proposed scheme outperforms the conventional low-energy adaptive clustering hierarchy (LEACH) protocol by over 90%, and an optimised version of LEACH based on k-means clustering by 42%.
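The head-selection rule described above (position in the cluster plus residual battery level) can be sketched as follows: within a cluster, prefer the node nearest the cluster centroid among those whose residual energy exceeds a threshold. This is a simplified illustration of the idea, not the DEKCS implementation; the node tuples and threshold are hypothetical.

```python
import math

def select_cluster_head(nodes, energy_threshold):
    """Pick a cluster head from `nodes`, a list of (x, y, residual_energy).

    Prefers the node closest to the cluster centroid among nodes whose
    residual energy meets the threshold; falls back to all nodes if none
    qualify, so the cluster is never left without a head.
    """
    cx = sum(n[0] for n in nodes) / len(nodes)
    cy = sum(n[1] for n in nodes) / len(nodes)
    eligible = [n for n in nodes if n[2] >= energy_threshold] or nodes
    return min(eligible, key=lambda n: math.hypot(n[0] - cx, n[1] - cy))
```

Centrality keeps intra-cluster links short (lower transmit power for members), while the energy constraint rotates the head role away from nearly depleted nodes, which is what extends network lifetime.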